
    A framework for biometric recognition using non-ideal iris and face

    Off-angle iris images are often captured in non-cooperative environments. Distortion of the iris or pupil can degrade segmentation quality and, in turn, the data extracted afterwards. Moreover, an iris captured at an off-angle of more than 30° can contain non-recoverable features, since its boundary cannot be properly localized; this limits the discriminative ability of the biometric features. Further limitations come from noisy data arising from image burst, background error, or camera pixel noise. To address these issues, this study develops a framework that: (1) improves non-circular boundary localization, (2) compensates for lost features, and (3) detects and minimizes errors caused by noisy data. The non-circular boundary issue is addressed through a combination of geometric calibration and direct least-squares ellipse fitting, which geometrically restores, adjusts, and scales the distorted circular shape into an ellipse. A further improvement is an extraction method that combines a Haar wavelet with a neural network to transform the iris features into wavelet coefficients representative of the relevant iris data. The non-recoverable features problem is resolved by a proposed weighted score-level fusion that integrates face and iris biometrics, adding distinctive information to increase the authentication accuracy rate. For the noisy data issue, modified Reed-Solomon codes with error-correction capability are proposed to decrease intra-class variation by eliminating differences between enrollment and verification templates. The key contribution of this research is a new unified framework for a high-performance multimodal biometric recognition system. The framework has been tested on the WVU, UBIRIS v.2, UTMIFM, and ORL datasets, and achieved more than 99.8% accuracy compared to other existing methods.
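
    The abstract does not give the fusion weights or the normalization scheme, so the sketch below is only a minimal illustration of weighted score-level fusion: each modality's matcher scores are min-max normalized and combined with fixed weights (w_iris and w_face are hypothetical values, not the authors').

```python
import numpy as np

def min_max_normalize(scores):
    """Scale raw matcher scores to [0, 1] so the two modalities are comparable."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min() + 1e-12)

def weighted_score_fusion(iris_scores, face_scores, w_iris=0.6, w_face=0.4):
    """Fuse normalized iris and face match scores with fixed modality weights."""
    assert abs(w_iris + w_face - 1.0) < 1e-9, "weights should sum to 1"
    return w_iris * min_max_normalize(iris_scores) + w_face * min_max_normalize(face_scores)

# Toy usage: a higher fused score means stronger evidence that the probe matches the claimed identity.
iris = [0.82, 0.35, 0.40]   # hypothetical iris matcher scores
face = [0.75, 0.50, 0.20]   # hypothetical face matcher scores
print(weighted_score_fusion(iris, face))
```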

    Artificial Neural Network and Savitzky Golay Derivative in Predicting Blood Hemoglobin Using Near-Infrared Spectrum

    Monitoring the blood hemoglobin level is essential for diagnosing anaemia. This study evaluates the capability of an artificial neural network (ANN) with Savitzky-Golay (SG) pre-processing in predicting the blood hemoglobin level from the near-infrared spectrum. The effects of the number of hidden neurons and of different SG pre-processing strategies were examined and discussed. An ANN coupled with the first-order SG derivative and five hidden neurons achieved better prediction performance than previous studies, with a root mean square error of prediction of 0.3517 g/dL and an Rp2 of 0.9849. The results show that an ANN coupled with the first-order SG derivative can improve near-infrared spectroscopic analysis for predicting the blood hemoglobin level, and that the proposed nonlinear model outperforms linear models without variable selection. These findings suggest that the modelling strategy is promising for establishing a better relationship between blood hemoglobin and near-infrared spectral data.
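
    As a rough illustration of this modelling strategy, the sketch below applies a first-order Savitzky-Golay derivative to placeholder spectra and fits an ANN with five hidden neurons, mirroring the reported configuration; the SG window length and polynomial order are assumptions, since the abstract does not state them.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# X: (n_samples, n_wavelengths) NIR spectra; y: reference hemoglobin values (g/dL).
rng = np.random.default_rng(0)
X = rng.random((100, 256))      # placeholder spectra; real data comes from an NIR instrument
y = rng.uniform(8, 17, 100)     # placeholder hemoglobin values

# First-order Savitzky-Golay derivative (window and polynomial order are assumed settings).
X_sg = savgol_filter(X, window_length=11, polyorder=2, deriv=1, axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(X_sg, y, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(5,), max_iter=5000, random_state=0).fit(X_tr, y_tr)
rmsep = mean_squared_error(y_te, ann.predict(X_te)) ** 0.5
print(f"RMSEP: {rmsep:.4f} g/dL")
```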

    Deep learning Convolutional Neural Network for Unconstrained License Plate Recognition

    The evolution of neural network algorithms into deep learning convolutional neural networks looks like the next generation of object detection: these models achieve significantly better accuracy and are not tied to any particular aspect ratio. License plate and traffic sign detection and recognition have a number of applications relevant to transportation systems, such as traffic monitoring, detection of stolen vehicles, driver navigation support, and statistical research. The exponential increase in the number of vehicles necessitates automated systems to maintain vehicle information, which is needed both for traffic management and for crime reduction. Number plate recognition is an effective way to identify vehicles automatically. A number of methods have been proposed, but only for particular cases and under constraints (e.g. known text direction or high resolution). Deep learning convolutional neural networks handle occlusion and rotation particularly well, so we believe this approach can provide a better solution to the unconstrained license plate recognition problem.
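
    The abstract does not describe a specific network, so the following is only an illustrative sketch of a small convolutional classifier for cropped plate characters (the input size, layer sizes, and class count are assumptions), not the authors' architecture.

```python
import torch
import torch.nn as nn

class PlateCharCNN(nn.Module):
    """Minimal CNN for classifying cropped license-plate characters (0-9, A-Z assumed)."""
    def __init__(self, n_classes=36):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 8 * 8, n_classes)  # assumes 32x32 grayscale input

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# One forward pass on a dummy batch of 32x32 grayscale character crops.
logits = PlateCharCNN()(torch.randn(4, 1, 32, 32))
print(logits.shape)  # torch.Size([4, 36])
```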

    Iris biometric cryptography for identity document

    Users tend to choose short passwords for authentication, which can be easily attacked. Biometric technologies such as fingerprint scanning, voice authentication, face recognition, signature, hand geometry, and iris recognition now play an important role in security-related applications. In this work, we present an approach to generate a unique and more secure cryptographic key from an iris template. The iris images are processed to produce an iris template, or code, to be used for the encryption and decryption tasks. The AES cryptographic algorithm is employed to encrypt and decrypt the identity data. Distance metrics such as Hamming distance and Euclidean distance are used for the template-matching identification process. Experimental results show that this system can achieve higher security with a low false rejection and false acceptance rate.
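
    A minimal sketch of the general idea, assuming a SHA-256 hash of the binary template as the key-derivation step and AES-GCM as the cipher mode (the abstract specifies neither). Note that a raw hash changes completely with a single flipped bit, which is exactly why the error-correction approach in the next entry matters.

```python
import hashlib
import os
import numpy as np
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def key_from_iris_code(iris_bits: np.ndarray) -> bytes:
    """Derive a 256-bit AES key by hashing the binary iris template."""
    return hashlib.sha256(np.packbits(iris_bits).tobytes()).digest()

def hamming_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Fraction of disagreeing bits between two iris codes (0 = identical)."""
    return float(np.count_nonzero(a != b)) / a.size

# Toy iris template (a real system extracts this from a segmented, normalized iris image).
iris_code = np.random.default_rng(1).integers(0, 2, 2048, dtype=np.uint8)
key = key_from_iris_code(iris_code)

aesgcm, nonce = AESGCM(key), os.urandom(12)
ciphertext = aesgcm.encrypt(nonce, b"identity-document-payload", None)
print(aesgcm.decrypt(nonce, ciphertext, None))  # round-trips only with the same template
```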

    An Improved Approach of Iris Biometric Authentication Performance and Security with Cryptography and Error Correction Codes

    One of the most challenging parts of integrating biometrics and cryptography is the intra-class variation in identifiers acquired from the same user. Due to environmental noise or differences between devices, iris features may differ when acquired at different times. This research focuses on improving the performance of iris biometric authentication and on encrypting the binary code generated from the acquired identifiers. The proposed biometric authentication system incorporates the concepts of non-repudiation and privacy, which are critical to the success of such a system. The iris was chosen as the biometric identifier for its high accuracy and its permanence throughout an individual's lifetime. This study seeks a method of reducing the noise and error arising from the inherent dissimilarity between biometric acquisitions. In our method, we use Reed-Solomon error-correction codes to reduce dissimilarities and noise in the iris data. Reed-Solomon is a block-based error-correcting code that can be easily decoded and has excellent burst-correction capabilities. Two distance metrics were used to measure the accuracy of the iris pattern-matching identification process: Hamming distance and weighted Euclidean distance. The experiments were conducted on the CASIA 1.0 iris database. The results showed a False Acceptance Rate of 0%, a False Rejection Rate of 1.54%, and a Total Success Rate of 98.46%. The proposed approach appears to be more secure, as it provides low rates of false rejections and false acceptances.
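
    A minimal sketch of the error-correction step, using the third-party reedsolo package; the parity-symbol count and template size are assumptions, since the paper's exact Reed-Solomon parameters are not given here.

```python
import numpy as np
from reedsolo import RSCodec  # third-party package: pip install reedsolo

# 32 parity bytes per block can correct up to 16 corrupted bytes (an assumed setting).
rsc = RSCodec(32)

# Enrollment: encode the iris template so the parity symbols can absorb later acquisition noise.
enrolled = np.random.default_rng(2).integers(0, 2, 1024, dtype=np.uint8)  # toy iris code
codeword = rsc.encode(bytes(np.packbits(enrolled)))  # 128 data bytes + 32 parity bytes

# Verification: a fresh capture of the same eye differs in scattered bits (intra-class variation).
noisy = bytearray(codeword)
for i in range(10):
    noisy[i] ^= 0xFF  # corrupt 10 data bytes to simulate enrollment/verification differences

decoded, _, _ = rsc.decode(noisy)  # reedsolo >= 1.5 returns (message, full codeword, errata positions)
print(bytes(decoded) == bytes(np.packbits(enrolled)))  # True: the exact template is restored
```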

    Zombie survival optimization in solving university examination timetabling problem

    Timetabling is the task of assigning a set of events to a set of resources while satisfying predefined constraints. University timetabling is one of the most studied problems among the timetabling domains. It is also a time-consuming administrative task that must be performed in every academic institution, as many constraints need to be considered. In this study, zombie survival optimization (ZSO) is applied to the university examination timetabling problem. The underlying idea of ZSO is based on the foraging behavior of zombies, where zombies represent search agents (solutions) searching for an antidote (the optimal solution). ZSO has three modes: an exploration mode, in which zombies explore for solutions randomly; a hunter mode, in which they move toward a human (a promising search region); and a human exploitation mode, in which they turn into humans to search locally for an optimum. ZSO was tested on Carter's un-capacitated university examination benchmark dataset, and the results demonstrate that it produces solutions of promising quality compared with published methods in the literature. In fact, ZSO recorded new best-known results on 3 instances of the dataset.
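
    The abstract gives only the three modes, not the update equations, so the sketch below is one loose interpretation of ZSO on a continuous toy function rather than the authors' timetabling formulation; the mode probabilities and step sizes are invented for illustration.

```python
import numpy as np

def zso_sketch(f, dim=2, n_zombies=30, iters=200, seed=0):
    """Toy interpretation of zombie survival optimization: exploration (random
    foraging), hunter (move toward the best-known region), and human
    exploitation (local refinement around the best solution)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5, 5, (n_zombies, dim))
    best = min(pop, key=f).copy()
    for _ in range(iters):
        for i in range(n_zombies):
            mode = rng.random()
            if mode < 0.3:                      # exploration mode: random search
                cand = rng.uniform(-5, 5, dim)
            elif mode < 0.7:                    # hunter mode: move toward the promising region
                cand = pop[i] + rng.random() * (best - pop[i])
            else:                               # human exploitation mode: local refinement
                cand = best + rng.normal(0, 0.1, dim)
            if f(cand) < f(pop[i]):             # greedy acceptance of improving moves
                pop[i] = cand
                if f(cand) < f(best):
                    best = cand.copy()
    return best

sphere = lambda x: float(np.sum(np.asarray(x) ** 2))
print(zso_sketch(sphere))  # should approach the optimum at the origin
```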

    Transformer in mRNA Degradation Prediction

    The unstable properties and the advantages of mRNA vaccines have encouraged many experts worldwide to tackle the degradation problem. Machine learning models are widely used in bioinformatics and healthcare to gain insights from biological data, so machine learning plays an important role in predicting the degradation rate of mRNA vaccine candidates. Stanford University held the OpenVaccine Challenge competition on Kaggle to gather top solutions to this problem, with multi-column root mean square error (MCRMSE) as the main performance metric. The Nucleic Transformer has been proposed by other researchers as a deep learning solution that utilizes a self-attention mechanism and a Convolutional Neural Network (CNN). This paper enhances the existing Nucleic Transformer by using the AdaBelief or RangerAdaBelief optimizer together with a proposed decoder consisting of a normalization layer between two linear layers. Based on the experimental results, the enhanced Nucleic Transformer outperforms the existing solution. In this study, the AdaBelief optimizer performs better than the RangerAdaBelief optimizer, even though the latter possesses Ranger's advantages. The advantage of the proposed decoder shows only when data is limited; when data is sufficient, its performance is similar to, but still better than, the linear decoder's, if and only if the AdaBelief optimizer is used. Overall, the combination of the AdaBelief optimizer with the proposed decoder performs best, with performance boosts of 2.79% and 1.38% in public and private MCRMSE, respectively.
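
    As a sketch of two of the named ingredients, the code below implements the MCRMSE metric and a decoder head with a normalization layer between two linear layers; the hidden size, target count, and sequence length are assumptions based on common OpenVaccine settings, not the paper's exact configuration. (The AdaBelief optimizer itself is available in the third-party adabelief-pytorch package.)

```python
import torch
import torch.nn as nn

def mcrmse(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Multi-column RMSE: RMSE computed per predicted property, averaged over columns."""
    return torch.sqrt(((pred - target) ** 2).mean(dim=(0, 1))).mean()

class NormalizedDecoder(nn.Module):
    """Decoder head in the spirit described above: a normalization layer
    sandwiched between two linear layers (hidden size is an assumption)."""
    def __init__(self, d_model=256, n_targets=5):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(d_model, d_model),
            nn.LayerNorm(d_model),
            nn.Linear(d_model, n_targets),
        )

    def forward(self, x):  # x: (batch, seq_len, d_model) encoder output
        return self.head(x)

x = torch.randn(2, 107, 256)  # dummy encoder output (107-base sequences, as in OpenVaccine)
y = torch.randn(2, 107, 5)    # 5 degradation-related target columns
print(mcrmse(NormalizedDecoder()(x), y))
```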